Bounding Sample Size with the Vapnik-Chervonenkis Dimension

Authors

  • John Shawe-Taylor
  • Martin Anthony
  • Norman Biggs
Abstract

A proof is given that a concept class is learnable provided its Vapnik-Chervonenkis dimension is finite. The proof is more explicit than previous proofs and introduces two new parameters which allow the sample-size bounds obtained to be improved by a factor of approximately 4 log2(e).
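
For orientation, bounds of the kind the abstract refers to are usually stated in the style of Blumer, Ehrenfeucht, Haussler and Warmuth: the sample size m needed to learn to accuracy eps with confidence 1 - delta grows with the VC dimension d roughly as m >= max((4/eps) log2(2/delta), (8d/eps) log2(13/eps)). The sketch below computes that classical bound; the constants are the standard ones, not the improved ones from this paper, which tightens bounds of this shape by a factor of about 4 log2(e), roughly 5.77.

    import math

    def pac_sample_bound(vc_dim: int, epsilon: float, delta: float) -> int:
        """Classical VC sample-size bound (Blumer et al. style).

        m >= max((4/eps) * log2(2/delta), (8*d/eps) * log2(13/eps)).
        Illustrative only: the paper abstracted above improves bounds
        of this shape by a factor of roughly 4*log2(e) ~= 5.77.
        """
        term_confidence = (4.0 / epsilon) * math.log2(2.0 / delta)
        term_dimension = (8.0 * vc_dim / epsilon) * math.log2(13.0 / epsilon)
        return math.ceil(max(term_confidence, term_dimension))

    # VC dimension 10, accuracy 0.1, confidence 0.9: about 5618 examples.
    print(pac_sample_bound(10, 0.1, 0.1))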

Similar articles

Vapnik-Chervonenkis Dimension

Valiant’s theorem from the previous lecture is meaningless for infinite hypothesis classes, or even classes with more than exponential size. In 1968, Vladimir Vapnik and Alexey Chervonenkis wrote a very original and influential paper (in Russian) [5, 6] which allows us to estimate the sample complexity for infinite hypothesis classes too. The idea is that the size of the hypothesis class is a p...
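
On a finite domain the shattering definition can be checked directly, which makes the role of the VC dimension concrete. Below is a minimal brute-force sketch (our own illustration, not taken from the lecture notes): a class shatters a point set if every binary labelling of the set is realised by some hypothesis, and the VC dimension is the size of the largest shattered set.

    from itertools import combinations

    def shatters(hypotheses, points):
        # Labellings realised on `points`; shattered iff all 2^n appear.
        labellings = {tuple(h(x) for x in points) for h in hypotheses}
        return len(labellings) == 2 ** len(points)

    def vc_dimension(hypotheses, domain):
        # Shattered sets are closed under subsets, so stop at the first
        # size k for which no k-subset of the domain is shattered.
        d = 0
        for k in range(1, len(domain) + 1):
            if any(shatters(hypotheses, s) for s in combinations(domain, k)):
                d = k
            else:
                break
        return d

    # Threshold functions x >= t realise only monotone labellings, so
    # they shatter single points but no pair: VC dimension 1.
    thresholds = [lambda x, t=t: x >= t for t in range(6)]
    print(vc_dimension(thresholds, domain=[1, 2, 3, 4]))  # -> 1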


Statistical Learning of Arbitrary Computable Classifiers

Statistical learning theory chiefly studies restricted hypothesis classes, particularly those with finite Vapnik-Chervonenkis (VC) dimension. The fundamental quantity of interest is the sample complexity: the number of samples required to learn to a specified level of accuracy. Here we consider learning over the set of all computable labeling functions. Since the VC-dimension is infinite and a ...


Relating Data Compression and Learnability

We explore the learnability of two-valued functions from samples using the paradigm of Data Compression. A first algorithm (compression) chooses a small subset of the sample, called the kernel. A second algorithm predicts future values of the function from the kernel, i.e. the algorithm acts as a hypothesis for the function to be learned. The second algorithm must be able to reconstruct...
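
As a concrete instance of the paradigm (a toy example of ours, not one from the paper), closed intervals on the line admit a compression scheme of size two: the kernel keeps only the outermost positive points, and the reconstruction predicts positively exactly on the interval they span.

    def compress(sample):
        # First algorithm: pick a kernel of at most two sample points.
        positives = [x for x, label in sample if label]
        return (min(positives), max(positives)) if positives else ()

    def reconstruct(kernel):
        # Second algorithm: a hypothesis built from the kernel alone.
        if not kernel:
            return lambda x: False
        lo, hi = kernel
        return lambda x: lo <= x <= hi

    # Any sample consistent with an interval is reconstructed exactly.
    sample = [(0.5, False), (1.2, True), (2.0, True), (3.7, False)]
    hypothesis = reconstruct(compress(sample))
    print(all(hypothesis(x) == label for x, label in sample))  # -> True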


Quantifying Generalization in Linearly Weighted Neural Networks

The Vapnik-Chervonenkis dimension has proven to be of great use in the theoretical study of generalization in artificial neural networks. The "probably approximately correct" learning framework is described and the importance of the Vapnik-Chervonenkis dimension is illustrated. We then investigate the Vapnik-Chervonenkis dimension of certain types of linearly weighted neural ne...
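
A standard fact in this line of work is that a single linear threshold unit over R^d has Vapnik-Chervonenkis dimension d + 1. The lower-bound half can be checked mechanically; the sketch below (an illustration of ours, not the paper's construction) verifies by grid search that halfplanes shatter three points in the plane.

    import itertools

    def halfplane(w1, w2, b):
        # Linear threshold unit: sign of w . x + b.
        return lambda p: w1 * p[0] + w2 * p[1] + b > 0

    points = [(0, 0), (1, 0), (0, 1)]
    grid = [-1.0, -0.5, 0.0, 0.5, 1.0]
    classifiers = [halfplane(w1, w2, b)
                   for w1, w2, b in itertools.product(grid, repeat=3)]

    # All 8 labellings of the three points are realised by some
    # halfplane, consistent with VC dimension d + 1 = 3 for d = 2.
    labellings = {tuple(h(p) for p in points) for h in classifiers}
    print(len(labellings) == 2 ** len(points))  # -> True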


Ellipsoidal Kernel Machines

A novel technique is proposed for improving the standard Vapnik-Chervonenkis (VC) dimension estimate for the Support Vector Machine (SVM) framework. The improved VC estimates are based on geometric arguments. By considering bounding ellipsoids instead of the usual bounding hyperspheres and assuming gap-tolerant classifiers, a linear classifier with a given margin is shown to shatter fewer point...
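
The sphere-based estimate the abstract improves on is usually stated as follows (our reading, following Burges' SVM tutorial): a gap-tolerant classifier with margin M, acting on data inside a ball of diameter D in R^d, has VC dimension at most min(ceil(D^2/M^2), d) + 1. A minimal sketch of that baseline bound, assuming this formulation:

    import math

    def gap_tolerant_vc_bound(diameter, margin, dim):
        # Sphere-based VC bound for gap-tolerant classifiers:
        #   VCdim <= min(ceil(D^2 / M^2), d) + 1.
        # A tighter bounding ellipsoid shrinks the effective D^2 / M^2
        # term, which is the kind of improvement the abstract describes.
        return min(math.ceil(diameter ** 2 / margin ** 2), dim) + 1

    # Diameter 2, margin 0.5, ambient dimension 100: bound of 17.
    print(gap_tolerant_vc_bound(2.0, 0.5, 100))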



Journal:
  • Discrete Applied Mathematics

Volume 42, Issue -

Pages -

Publication date: 1993